Console Output
Training and evaluating model for: PC
Dataset length: 20063 windows
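The dataset is consumed as overlapping windows of the multichannel input. A minimal sketch of how such windows are typically produced (the window size and stride here are illustrative assumptions; the run above reports 20063 windows for the PC appliance):

```python
import numpy as np

def make_windows(series, window_size, stride=1):
    """Slice a (T, C) multivariate series into overlapping (window_size, C) windows.

    window_size and stride are hypothetical parameters for illustration;
    the actual values used by the training script are not shown in the log.
    """
    T = len(series)
    starts = range(0, T - window_size + 1, stride)
    return np.stack([series[i:i + window_size] for i in starts])

# A 10-step, 3-channel series with window_size=4 and stride=1 yields 7 windows.
windows = make_windows(np.zeros((10, 3)), window_size=4)
```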
NILMModel(
(conv1d): Conv1d(9, 9, kernel_size=(3,), stride=(1,), padding=(1,))
(lstm): LSTM(9, 128, num_layers=4, batch_first=True, dropout=0.1)
(dropout): Dropout(p=0.1, inplace=False)
(relu): ReLU()
(output_layer): Linear(in_features=128, out_features=1, bias=True)
)
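The printed module repr above fixes the layer shapes exactly (Conv1d 9→9, a 4-layer LSTM with hidden size 128, dropout 0.1, and a single-unit linear head). A sketch of a matching model follows; the forward-pass ordering (conv → ReLU → LSTM → dropout → linear on the last time step) is an assumption, since the repr only lists the submodules:

```python
import torch
import torch.nn as nn

class NILMModel(nn.Module):
    """Reconstruction of the printed architecture; forward() ordering is assumed."""

    def __init__(self, in_channels=9, hidden_size=128, num_layers=4, dropout=0.1):
        super().__init__()
        self.conv1d = nn.Conv1d(in_channels, in_channels, kernel_size=3,
                                stride=1, padding=1)
        self.lstm = nn.LSTM(in_channels, hidden_size, num_layers=num_layers,
                            batch_first=True, dropout=dropout)
        self.dropout = nn.Dropout(p=dropout)
        self.relu = nn.ReLU()
        self.output_layer = nn.Linear(hidden_size, 1)

    def forward(self, x):
        # x: (batch, seq_len, channels); Conv1d expects (batch, channels, seq_len)
        x = self.relu(self.conv1d(x.transpose(1, 2))).transpose(1, 2)
        out, _ = self.lstm(x)          # (batch, seq_len, hidden_size)
        out = self.dropout(out)
        return self.output_layer(out[:, -1, :])  # one value per window (assumed)

model = NILMModel()
y = model(torch.randn(2, 60, 9))  # batch of 2 windows, 60 time steps, 9 channels
```

With `padding=1` and `kernel_size=3` the convolution preserves sequence length, so any window size works unchanged.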
Epoch [1/300], Train Loss: 0.008040
Validation Loss: 0.007125
Epoch [2/300], Train Loss: 0.008025
Validation Loss: 0.007140
Epoch [3/300], Train Loss: 0.007984
Validation Loss: 0.007054
Epoch [4/300], Train Loss: 0.007899
Validation Loss: 0.007027
Epoch [5/300], Train Loss: 0.007878
Validation Loss: 0.007016
Epoch [6/300], Train Loss: 0.007830
Validation Loss: 0.007013
Epoch [7/300], Train Loss: 0.007850
Validation Loss: 0.007002
Epoch [8/300], Train Loss: 0.007830
Validation Loss: 0.007001
Epoch [9/300], Train Loss: 0.007835
Validation Loss: 0.006992
Epoch [10/300], Train Loss: 0.007870
Validation Loss: 0.006997
Epoch [11/300], Train Loss: 0.007848
Validation Loss: 0.006990
Epoch [12/300], Train Loss: 0.007863
Validation Loss: 0.006996
Epoch [13/300], Train Loss: 0.007799
Validation Loss: 0.007010
Epoch [14/300], Train Loss: 0.007792
Validation Loss: 0.006982
Epoch [15/300], Train Loss: 0.007802
Validation Loss: 0.006979
Epoch [16/300], Train Loss: 0.007784
Validation Loss: 0.006973
Epoch [17/300], Train Loss: 0.007812
Validation Loss: 0.007004
Epoch [18/300], Train Loss: 0.007784
Validation Loss: 0.007033
Epoch [19/300], Train Loss: 0.007771
Validation Loss: 0.006984
Epoch [20/300], Train Loss: 0.007813
Validation Loss: 0.006970
Epoch [21/300], Train Loss: 0.007776
Validation Loss: 0.006968
Epoch [22/300], Train Loss: 0.007800
Validation Loss: 0.006961
Epoch [23/300], Train Loss: 0.007778
Validation Loss: 0.006950
Epoch [24/300], Train Loss: 0.007754
Validation Loss: 0.006998
Epoch [25/300], Train Loss: 0.007750
Validation Loss: 0.006925
Epoch [26/300], Train Loss: 0.007733
Validation Loss: 0.006960
Epoch [27/300], Train Loss: 0.007709
Validation Loss: 0.006861
Epoch [28/300], Train Loss: 0.007651
Validation Loss: 0.006791
Epoch [29/300], Train Loss: 0.007568
Validation Loss: 0.007294
Epoch [30/300], Train Loss: 0.007719
Validation Loss: 0.006595
Epoch [31/300], Train Loss: 0.007577
Validation Loss: 0.006512
Epoch [32/300], Train Loss: 0.007717
Validation Loss: 0.007033
Epoch [33/300], Train Loss: 0.007599
Validation Loss: 0.006329
Epoch [34/300], Train Loss: 0.007213
Validation Loss: 0.006163
Epoch [35/300], Train Loss: 0.007044
Validation Loss: 0.006277
Epoch [36/300], Train Loss: 0.006910
Validation Loss: 0.006095
Epoch [37/300], Train Loss: 0.006844
Validation Loss: 0.006274
Epoch [38/300], Train Loss: 0.006773
Validation Loss: 0.006458
Epoch [39/300], Train Loss: 0.006864
Validation Loss: 0.006282
Epoch [40/300], Train Loss: 0.007591
Validation Loss: 0.005976
Epoch [41/300], Train Loss: 0.006670
Validation Loss: 0.006126
Epoch [42/300], Train Loss: 0.006704
Validation Loss: 0.005798
Epoch [43/300], Train Loss: 0.006717
Validation Loss: 0.005778
Epoch [44/300], Train Loss: 0.006350
Validation Loss: 0.005920
Epoch [45/300], Train Loss: 0.006263
Validation Loss: 0.005890
Epoch [46/300], Train Loss: 0.006230
Validation Loss: 0.006037
Epoch [47/300], Train Loss: 0.006207
Validation Loss: 0.005579
Epoch [48/300], Train Loss: 0.006229
Validation Loss: 0.005810
Epoch [49/300], Train Loss: 0.006149
Validation Loss: 0.005713
Epoch [50/300], Train Loss: 0.005972
Validation Loss: 0.005667
Epoch [51/300], Train Loss: 0.006132
Validation Loss: 0.005453
Epoch [52/300], Train Loss: 0.006082
Validation Loss: 0.006517
Epoch [53/300], Train Loss: 0.005988
Validation Loss: 0.005370
Epoch [54/300], Train Loss: 0.005968
Validation Loss: 0.005819
Epoch [55/300], Train Loss: 0.006075
Validation Loss: 0.005549
Epoch [56/300], Train Loss: 0.005926
Validation Loss: 0.005241
Epoch [57/300], Train Loss: 0.005954
Validation Loss: 0.005735
Epoch [58/300], Train Loss: 0.005901
Validation Loss: 0.005408
Epoch [59/300], Train Loss: 0.005796
Validation Loss: 0.005335
Epoch [60/300], Train Loss: 0.005795
Validation Loss: 0.005108
Epoch [61/300], Train Loss: 0.005710
Validation Loss: 0.005102
Epoch [62/300], Train Loss: 0.005654
Validation Loss: 0.005308
Epoch [63/300], Train Loss: 0.005789
Validation Loss: 0.005120
Epoch [64/300], Train Loss: 0.005719
Validation Loss: 0.005204
Epoch [65/300], Train Loss: 0.005747
Validation Loss: 0.005065
Epoch [66/300], Train Loss: 0.005723
Validation Loss: 0.005313
Epoch [67/300], Train Loss: 0.005796
Validation Loss: 0.005103
Epoch [68/300], Train Loss: 0.005627
Validation Loss: 0.005031
Epoch [69/300], Train Loss: 0.005712
Validation Loss: 0.005240
Epoch [70/300], Train Loss: 0.005719
Validation Loss: 0.005039
Epoch [71/300], Train Loss: 0.005785
Validation Loss: 0.005450
Epoch [72/300], Train Loss: 0.005703
Validation Loss: 0.005149
Epoch [73/300], Train Loss: 0.005616
Validation Loss: 0.005431
Epoch [74/300], Train Loss: 0.005593
Validation Loss: 0.004921
Epoch [75/300], Train Loss: 0.005524
Validation Loss: 0.004911
Epoch [76/300], Train Loss: 0.005600
Validation Loss: 0.005390
Epoch [77/300], Train Loss: 0.005524
Validation Loss: 0.005094
Epoch [78/300], Train Loss: 0.005533
Validation Loss: 0.005199
Epoch [79/300], Train Loss: 0.005702
Validation Loss: 0.004919
Epoch [80/300], Train Loss: 0.005432
Validation Loss: 0.004965
Epoch [81/300], Train Loss: 0.005473
Validation Loss: 0.005298
Epoch [82/300], Train Loss: 0.005747
Validation Loss: 0.004935
Epoch [83/300], Train Loss: 0.005447
Validation Loss: 0.005738
Epoch [84/300], Train Loss: 0.005513
Validation Loss: 0.004952
Epoch [85/300], Train Loss: 0.005449
Validation Loss: 0.004846
Epoch [86/300], Train Loss: 0.005426
Validation Loss: 0.004891
Epoch [87/300], Train Loss: 0.005410
Validation Loss: 0.004931
Epoch [88/300], Train Loss: 0.005582
Validation Loss: 0.005113
Epoch [89/300], Train Loss: 0.005382
Validation Loss: 0.005350
Epoch [90/300], Train Loss: 0.005371
Validation Loss: 0.005243
Epoch [91/300], Train Loss: 0.005361
Validation Loss: 0.004815
Epoch [92/300], Train Loss: 0.005462
Validation Loss: 0.004813
Epoch [93/300], Train Loss: 0.005330
Validation Loss: 0.005521
Epoch [94/300], Train Loss: 0.005386
Validation Loss: 0.004969
Epoch [95/300], Train Loss: 0.005330
Validation Loss: 0.004767
Epoch [96/300], Train Loss: 0.005373
Validation Loss: 0.004801
Epoch [97/300], Train Loss: 0.005406
Validation Loss: 0.004947
Epoch [98/300], Train Loss: 0.005451
Validation Loss: 0.004757
Epoch [99/300], Train Loss: 0.005224
Validation Loss: 0.004776
Epoch [100/300], Train Loss: 0.005303
Validation Loss: 0.005131
Epoch [101/300], Train Loss: 0.005410
Validation Loss: 0.004777
Epoch [102/300], Train Loss: 0.005383
Validation Loss: 0.004716
Epoch [103/300], Train Loss: 0.005241
Validation Loss: 0.004731
Epoch [104/300], Train Loss: 0.005359
Validation Loss: 0.004685
Epoch [105/300], Train Loss: 0.005287
Validation Loss: 0.004727
Epoch [106/300], Train Loss: 0.005338
Validation Loss: 0.004961
Epoch [107/300], Train Loss: 0.005387
Validation Loss: 0.004718
Epoch [108/300], Train Loss: 0.005319
Validation Loss: 0.004743
Epoch [109/300], Train Loss: 0.005330
Validation Loss: 0.004731
Epoch [110/300], Train Loss: 0.005254
Validation Loss: 0.005440
Epoch [111/300], Train Loss: 0.005267
Validation Loss: 0.004755
Epoch [112/300], Train Loss: 0.005150
Validation Loss: 0.004643
Epoch [113/300], Train Loss: 0.005230
Validation Loss: 0.004768
Epoch [114/300], Train Loss: 0.005298
Validation Loss: 0.005019
Epoch [115/300], Train Loss: 0.005234
Validation Loss: 0.004855
Epoch [116/300], Train Loss: 0.005174
Validation Loss: 0.004794
Epoch [117/300], Train Loss: 0.006468
Validation Loss: 0.006371
Epoch [118/300], Train Loss: 0.005658
Validation Loss: 0.004832
Epoch [119/300], Train Loss: 0.005349
Validation Loss: 0.004736
Epoch [120/300], Train Loss: 0.005169
Validation Loss: 0.004879
Epoch [121/300], Train Loss: 0.005216
Validation Loss: 0.005080
Epoch [122/300], Train Loss: 0.005182
Validation Loss: 0.004667
Early stopping triggered
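Training halts at epoch 122, ten epochs after the best validation loss (0.004643 at epoch 112), which is consistent with patience-based early stopping. A minimal sketch of that mechanism (the patience value of 10 is inferred from the log, not confirmed by it):

```python
class EarlyStopping:
    """Stop training when validation loss has not improved for `patience` epochs."""

    def __init__(self, patience=10, min_delta=0.0):
        self.patience = patience
        self.min_delta = min_delta
        self.best = float("inf")
        self.counter = 0

    def step(self, val_loss):
        """Record one epoch's validation loss; return True when training should stop."""
        if val_loss < self.best - self.min_delta:
            self.best = val_loss
            self.counter = 0
        else:
            self.counter += 1
        return self.counter >= self.patience

# Usage: losses improve, then plateau for `patience` epochs, triggering the stop.
stopper = EarlyStopping(patience=3)
history = [stopper.step(v) for v in [1.0, 0.9, 0.95, 0.96, 0.97]]
```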
Evaluating model for: PC
Validation MAE: 6.149936 W
Validation MSE: 215.432922 W²
Validation RMSE: 14.677633 W
Signal Aggregate Error (SAE): 0.000063
Normalized Disaggregation Error (NDE): 0.388786
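The metrics above can be computed from the predicted and ground-truth appliance power series. The SAE and NDE formulas below are the standard NILM definitions (relative error of total energy, and squared error normalized by signal power); they are assumed to match what the script reports:

```python
import math

def nilm_metrics(y_true, y_pred):
    """MAE/MSE/RMSE plus the standard NILM metrics SAE and NDE.

    y_true, y_pred: equal-length sequences of appliance power values (W).
    """
    n = len(y_true)
    mae = sum(abs(t - p) for t, p in zip(y_true, y_pred)) / n
    mse = sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / n
    rmse = math.sqrt(mse)
    # Signal Aggregate Error: relative error of the total predicted energy
    sae = abs(sum(y_pred) - sum(y_true)) / sum(y_true)
    # Normalized Disaggregation Error: squared error normalized by signal power
    nde = sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) \
        / sum(t ** 2 for t in y_true)
    return {"mae": mae, "mse": mse, "rmse": rmse, "sae": sae, "nde": nde}

m = nilm_metrics([10.0, 20.0, 30.0], [12.0, 18.0, 30.0])
```

A near-zero SAE with a larger NDE, as in the run above, means total energy is estimated well even though the per-sample waveform still deviates.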
[Plot: Training and Validation Loss]